




데이터베이스 연구회지 (Database Research Journal, SIGDB)


Korean Title: 사전 학습 언어 모델에서의 외부 지식 활용 기술 동향
English Title: A survey on the recent techniques for extending external knowledge in pre-trained language models
Authors: Ju-Yeon Yu (유주연), Donghun Yang (양동헌), Kyong-Ha Lee (이경하)
Citation: Vol. 39, No. 1, pp. 33-58 (Apr. 2023)
Korean Abstract (translated): Recent pre-trained language models show superior performance over existing models across a variety of natural language processing tasks. However, because most pre-trained language models are trained on general corpora such as Wikipedia or news articles, they show relatively low performance on tasks in specific domains such as medicine or science. In addition, since pre-trained language models generally use training corpora extracted from documents, information about causal relations and inference is difficult for them to learn, leaving them weak at logical interpretation. To address this, studies that aim to improve the performance of pre-trained language models by exploiting additional, pre-existing knowledge beyond the training corpus are being actively pursued. Representative sources of such knowledge include knowledge graphs and other external documents. This paper surveys the latest techniques for extending pre-trained language models with external knowledge and discusses the open issues and challenges they raise.
English Abstract: Pre-trained language models (PLMs) have recently exhibited superior performance compared to existing NLP models in various natural language processing tasks. However, most language models are trained with a general training corpus such as Wikipedia or news articles, so they exhibit relatively low performance in specific fields such as medicine or science. In addition, language models are poor at capturing causal relationships and drawing inferences, which leaves them weak at logical interpretation. Studies aimed at improving the performance of PLMs by using existing additional knowledge resources beyond the training corpus have been actively performed in recent years. Knowledge graphs and external documents are the main sources of external knowledge that can be easily acquired. This paper intends to assist in understanding the latest technological trends in the extension of PLMs using external knowledge resources. We also discuss open issues and challenges posed by these techniques.
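One family of techniques the abstract alludes to is injecting knowledge-graph facts into a PLM's input. A minimal sketch of that idea, using a hypothetical toy knowledge graph and entity matching (the graph contents, function names, and `[KNOWLEDGE]`/`[TEXT]` segment markers are illustrative assumptions, not from the paper):

```python
# Toy knowledge graph: entity -> list of (subject, predicate, object) triples.
# In a real system these would be retrieved from a resource such as Wikidata.
KNOWLEDGE_GRAPH = {
    "aspirin": [("aspirin", "treats", "headache"),
                ("aspirin", "is_a", "NSAID")],
    "BERT":    [("BERT", "is_a", "pre-trained language model")],
}

def retrieve_triples(text):
    """Return triples for every known entity mentioned in the input text."""
    triples = []
    for entity, facts in KNOWLEDGE_GRAPH.items():
        if entity.lower() in text.lower():
            triples.extend(facts)
    return triples

def augment_input(text):
    """Linearize retrieved triples and prepend them as a knowledge segment,
    which a PLM can then attend over together with the original input."""
    triples = retrieve_triples(text)
    if not triples:
        return text
    context = " ".join(f"{s} {p} {o}." for s, p, o in triples)
    return f"[KNOWLEDGE] {context} [TEXT] {text}"

print(augment_input("Does aspirin help with a headache?"))
```

The augmented string would be tokenized and fed to the model in place of the raw input; more elaborate approaches fuse the triples at the embedding or attention level rather than as plain text.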
Keywords: Deep Learning, Artificial Intelligence, Natural Language Processing, Language Model, Knowledge Graph